Results 1 - 2 of 2
1.
BMC Med Educ; 24(1): 448, 2024 Apr 24.
Article in English | MEDLINE | ID: mdl-38658906

ABSTRACT

OBJECTIVES: This study aimed to investigate the utility of the RAND/UCLA appropriateness method (RAM) in validating expert consensus-based multiple-choice questions (MCQs) on electrocardiogram (ECG).

METHODS: According to the RAM user's manual, nine panelists comprising various experts who routinely handle ECGs were asked to reach a consensus in three phases: a preparatory phase (round 0), an online test phase (round 1), and a face-to-face expert panel meeting (round 2). In round 0, the objectives and future timeline of the study were explained to the nine expert panelists, along with a summary of the relevant literature. In round 1, 100 ECG questions prepared by two skilled cardiologists were answered, and the success rate was calculated by dividing the number of correct answers by 9. Furthermore, the questions were stratified into "Appropriate," "Discussion," or "Inappropriate" according to the median score and interquartile range (IQR) of the appropriateness ratings given by the nine panelists. In round 2, the validity of the 100 ECG questions was discussed in an expert panel meeting in light of the round 1 results and finally reassessed as "Appropriate," "Candidate," "Revision," or "Defer."

RESULTS: In round 1, the average success rate of the nine experts was 0.89. Using the median score and IQR, 54 questions were classified as "Discussion." In the round 2 expert panel meeting, 23% of the original 100 questions were ultimately deemed inappropriate, although they had been prepared by two skilled cardiologists. Most of the 46 questions categorized as "Appropriate" by median score and IQR in round 1 were still considered "Appropriate" after round 2 (44/46, 95.7%).

CONCLUSIONS: The use of the median score and IQR allowed for a more objective determination of question validity. The RAM may help select appropriate questions, contributing to the preparation of higher-quality tests.


Subject(s)
Electrocardiography, Humans, Consensus, Reproducibility of Results, Clinical Competence/standards, Educational Measurement/methods, Cardiology/standards
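The round 1 triage described above classifies each question from the median and IQR of the nine panelists' appropriateness ratings. A minimal sketch of that idea follows, assuming the RAM 1-9 appropriateness scale; the exact cut-offs (median ≥ 7 appropriate, ≤ 3 inappropriate, IQR ≤ 2 taken as agreement) are illustrative assumptions, not the study's published thresholds.

```python
# Illustrative RAM-style triage of one question from nine panelists'
# 1-9 appropriateness ratings. Thresholds are assumptions for
# demonstration, not the cut-offs used in the study.
from statistics import median, quantiles

def classify(ratings):
    med = median(ratings)
    q1, _, q3 = quantiles(ratings, n=4, method="inclusive")
    iqr = q3 - q1
    if iqr > 2:          # wide spread -> panel disagreement
        return "Discussion"
    if med >= 7:         # high median with agreement
        return "Appropriate"
    if med <= 3:         # low median with agreement
        return "Inappropriate"
    return "Discussion"  # mid-range median -> uncertain

print(classify([8, 9, 7, 8, 9, 8, 7, 9, 8]))  # tight, high ratings
print(classify([2, 9, 5, 8, 1, 7, 3, 9, 4]))  # wide spread
```

The first question would be kept, while the second, despite a mid-range median, would be flagged for the round 2 panel discussion because the ratings disagree.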
2.
BMJ Open; 13(5): e072097, 2023 May 23.
Article in English | MEDLINE | ID: mdl-37221035

ABSTRACT

INTRODUCTION: Although the ECG is an important diagnostic tool in medical practice, competency in ECG interpretation is considered to be poor. Diagnostic inaccuracy involving the misinterpretation of ECGs can lead to inappropriate medical judgements and cause negative clinical outcomes, unnecessary medical testing and even fatalities. Despite the importance of assessing ECG interpretation skills, there is currently no established universal, standardised assessment tool for ECG interpretation. The current study seeks to (1) develop a set of items (ECG questions) for estimating competency in ECG interpretation by medical personnel through consensus among expert panels, following a process based on the RAND/UCLA Appropriateness Method (RAM), and (2) analyse item parameters and multidimensional latent factors of the test set to develop an assessment tool.

METHODS AND ANALYSIS: This study will be conducted in two steps: (1) selection of question items for ECG interpretation assessment by expert panels via a consensus process following RAM and (2) cross-sectional, web-based testing using a set of ECG questions. A multidisciplinary panel of experts will evaluate the answers and their appropriateness and select 50 questions for the next step. Based on data collected from a predicted sample size of 438 test participants recruited from physicians, nurses, medical and nursing students, and other healthcare professionals, we plan to statistically analyse item parameters and participant performance using multidimensional item response theory. Additionally, we will attempt to detect possible latent factors in the competency of ECG interpretation. A test set of question items for ECG interpretation will be proposed on the basis of the extracted parameters.

ETHICS AND DISSEMINATION: The protocol of this study was approved by the Institutional Review Board of Ehime University Graduate School of Medicine (IRB number: 2209008). We will obtain informed consent from all participants. The findings will be submitted for publication in peer-reviewed journals.


Subject(s)
Ethics Committees, Research, Fishes, Humans, Animals, Consensus, Cross-Sectional Studies, Electrocardiography
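The planned analysis fits a multidimensional item response theory (IRT) model to participants' responses. As a minimal, unidimensional illustration of the underlying idea, the two-parameter logistic (2PL) model gives the probability that a test taker of ability theta answers an item correctly, given that item's discrimination a and difficulty b. The parameter values below are invented for illustration and are not taken from the study.

```python
# Minimal unidimensional 2PL item response model: the probability of a
# correct response rises with ability (theta) relative to item
# difficulty (b), with steepness set by discrimination (a).
# Parameter values are illustrative, not from the study.
import math

def p_correct(theta, a, b):
    """2PL model: P(correct) = 1 / (1 + exp(-a * (theta - b)))."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# An examinee of average ability (theta = 0) facing an item of average
# difficulty (b = 0) answers correctly half the time, regardless of a.
print(p_correct(0.0, 1.5, 0.0))   # 0.5
print(p_correct(2.0, 1.5, 0.0))   # high-ability examinee, easier win
print(p_correct(-2.0, 1.5, 0.0))  # low-ability examinee
```

Fitting such a model to the 438 participants' responses yields per-item parameters (a, b); the multidimensional extension the authors propose additionally estimates several latent ability dimensions per participant.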